The Lancet Regional Health - Western Pacific
Elsevier BV
Preprints posted in the last 7 days, ranked by how well they match The Lancet Regional Health - Western Pacific's content profile, based on 15 papers previously published here. The average preprint has a 0.01% match score for this journal, so anything above that is already an above-average fit.
Ogwel, B.; Awuor, A. O.; Onyando, B. O.; Ochieng, R.; Hossain, M. J.; Conteh, B.; Mujahid, W.; Shaheen, F.; Munthali, V.; Malemia, T.; Tapia, M.; Keita, A. M.; Nasrin, D.; Kosek, M. N.; Qadri, F.; Kotloff, K. L.; Pavlinac, P. B.; McQuade, E. T. R.
Although the co-occurrence of diarrhea and malnutrition is well documented, research has largely focused on the acute management of diarrheal illness. Despite its importance, longitudinal evidence characterizing post-diarrheal recovery trajectories is sparse. We sought to characterize post-diarrheal nutritional recovery trajectories among children aged 6-35 months who were malnourished at enrollment using data from the Enterics for Global Health (EFGH) Shigella Surveillance study (2022-2024). EFGH enrolled children aged 6-35 months presenting with medically-attended diarrhea and followed them at 4 weeks and 3 months post-enrollment. This analysis included children with baseline wasting, stunting, or underweight (z-score < -2) and complete anthropometric follow-up. Latent class mixed-effects models were used to identify distinct post-diarrheal growth trajectories based on changes in anthropometric z-scores over time. Multinomial modified Poisson regression models examined associations between baseline factors and trajectory membership. Among 9,480 enrolled children, 16.5% (n=1,561) were wasted, 22.7% (n=2,155) stunted, and 21.0% (n=1,994) underweight at baseline. Wasting showed greater recovery potential (80.8%) compared with stunting (38.5%) and underweight (40.3%). Recovery was shaped by factors across multiple levels. Clinical severity markers (prolonged diarrhea, dehydration, and hypoxemia) increased the risk of nutritional failure. Age also influenced outcomes: infants were more likely to worsen, whereas older toddlers more often experienced stagnation. Interventions, including exclusive breastfeeding, oral rehydration therapy, appropriate antibiotics, and zinc supplementation, improved outcomes, while unimproved sanitation undermined recovery. These findings highlight the need for integrated strategies combining infection control, nutritional rehabilitation, and water, sanitation, and hygiene interventions tailored to children's developmental stage.
Key Messages
- Post-diarrheal nutritional recovery is highly heterogeneous, with wasting showing the greatest potential for improvement, while stunting and underweight often result in persistent growth stagnation.
- Baseline anthropometric deficits alone are insufficient to predict recovery, highlighting the need for dynamic monitoring and individualized management.
- Infants are particularly vulnerable to acute nutritional deterioration, while older toddlers frequently experience growth stagnation.
- Modifiable protective factors, including exclusive breastfeeding, ORS, zinc, and appropriate antibiotics, improved outcomes, whereas poor sanitation undermined recovery.
- Integrated strategies, tailored to a child's developmental stage, combining clinical care, nutrition, and environmental interventions are critical to support sustained child growth and development.
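The enrollment criterion above (any anthropometric z-score below -2) can be sketched as a simple, non-exclusive classification. This is an illustrative helper only, not code from the study; the function name and signature are hypothetical.

```python
def classify_malnutrition(wlz: float, laz: float, waz: float) -> set[str]:
    """Return the set of deficits implied by anthropometric z-scores.

    Categories can overlap: a child may be wasted, stunted, and
    underweight at the same time.
    """
    deficits = set()
    if wlz < -2:
        deficits.add("wasted")       # weight-for-length z-score < -2
    if laz < -2:
        deficits.add("stunted")      # length-for-age z-score < -2
    if waz < -2:
        deficits.add("underweight")  # weight-for-age z-score < -2
    return deficits
```

Because the categories are not mutually exclusive, the baseline percentages reported in the abstract (16.5% wasted, 22.7% stunted, 21.0% underweight) need not sum to the share of children included in the analysis.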
Essex, R.; Lim, S.; Jagnoor, J.
Drowning remains a major global public health challenge, yet how built environment characteristics shape population-level drowning risk remains poorly understood. This study linked satellite-derived built environment data to subnational drowning mortality estimates across 203 regions in 12 countries from 2006 to 2021. It found that built environment associations with drowning mortality are complex, non-linear, and shaped by development context. Urban extent was strongly protective, while built area near water showed protection overall but increased risk when combined with high population crowding. Almost all drowning mortality variance occurred between regions rather than within regions over time, indicating risk is predominantly determined by place-based characteristics. Income-stratified analyses revealed profound heterogeneity: crowding was protective in low- to middle-income settings but near-null in high-income regions, while waterfront development captured very different realities across contexts. These findings highlight the importance of tailoring drowning prevention strategies to local built environment configurations and development contexts.
Essex, R.; Lim, S.; Jagnoor, J.
Background: Drowning remains a major global public health challenge. This study examined whether the timing and trajectories of urbanisation, beyond the current built environment, are associated with subnational drowning mortality. Methods: We linked satellite-derived measures of built-environment change (GHSL), population crowding (WorldPop), surface water exposure (JRC Global Surface Water), and infrastructure proxies (VIIRS/DMSP nighttime lights) to GBD 2021 drowning mortality estimates across 203 ADM1 regions in 12 countries (2006-2021; 3,248 region-year observations). Temporal predictors captured recent expansion, development "newness" (≤10-year built share), acceleration/volatility, and a crowding × growth interaction. We screened predictors using LASSO (10-fold cross-validation) and fitted mixed-effects models with region random intercepts. Distributed-lag models tested temporal precedence and development age, and income-stratified models assessed heterogeneity. Results: Adding temporal predictors improved fit beyond contemporaneous built-environment measures (ΔAIC=177; ΔBIC=147). In adjusted models, the crowding × growth interaction was strongly positively associated with drowning mortality, and a higher share of recent development was associated with higher mortality. Lag models showed a development age gradient: older built environment was most protective. Associations differed by income group, with several key coefficients reversing sign across strata. Discussion: Drowning mortality appears shaped by development histories as well as present-day conditions, with risk concentrated in rapidly changing, dense settings and the newest built environments. Cross-context heterogeneity suggests mechanisms and prevention priorities are unlikely to be uniform. Conclusions: Development timing and trajectories help explain subnational drowning mortality beyond current built form alone.
Prevention and planning should prioritise transition-period safety strategies in newly developing and rapidly densifying areas.
Sawadogo, J. W.; Hema, A.; Diarra, A.; Kabore, J. M.; Hien, D.; Kouraogo, L.; Zou, A. R.; Ouedraogo, A. Z.; Tiono, A. B.; Datta, S.; Pasetti, M. F.; Neuzil, K. M.; Sirima, S. B.; Ouedraogo, A.; Laurens, M. B.
Typhoid fever remains a significant public health challenge in low- and middle-income countries. In 2018, the World Health Organization recommended a single-dose typhoid conjugate vaccine (TCV) for routine immunization in endemic settings; however, evidence guiding booster doses remains limited. Homologous TCV booster doses have demonstrated immune boosting. This study assessed the immunogenicity and safety of a heterologous booster using a Vi capsular polysaccharide-CRM197 TCV (Vi-CRM) administered 5-6 years after primary vaccination with a Vi capsular polysaccharide tetanus toxoid TCV (Vi-TT) in children. Children previously enrolled in a Phase 2 trial were recruited. Participants who had received TCV at 9-11 or 15-23 months were given a Vi-CRM booster at 6-7 years of age (Booster-TCV group), and controls received their first TCV dose at the same age (1st-TCV group). Serum anti-Vi IgG concentrations were measured at baseline and 28 days post-vaccination. Solicited and unsolicited adverse events (AEs) and serious adverse events (SAEs) were recorded. Among 147 children enrolled, 87 received a second and 60 received a first TCV dose. Baseline anti-Vi IgG geometric mean titers (GMTs) were higher in the Booster-TCV group (21.5 EU/mL; 95% CI: 17.2-26.8) than in the 1st-TCV group (5.5 EU/mL; 95% CI: 4.5-6.7). At day 28, GMTs rose markedly in both groups: 5140.0 EU/mL (95% CI: 4302.0-6141.3) in the Booster-TCV group and 2084.8 EU/mL (95% CI: 1724.4-2520.5) in the 1st-TCV group. Local reactions and systemic AEs were mild. No SAEs were observed. Vi-TT-induced immunity persisted for at least 5-6 years, and a heterologous booster triggered a strong immune response with universal seroconversion. These findings support heterologous prime-boost strategies to maintain protection in school-age children and inform optimization of TCV schedules in endemic regions.
Koskei, G.; Karanja, S.; Ndugu, Z. W.; Anino, C. O.
Child undernutrition remains a major public health challenge in Kenya. Suboptimal feeding practices contribute significantly to persistent underweight and stunting. This study evaluated the effect of a community-based Positive Deviance Hearth (PDH) intervention on feeding practices among children aged 6-59 months in a sub-county within the study county. The study adopted a two-group pretest-posttest randomized experimental design conducted over a six-month period among 84 caregiver-child pairs in intervention and control groups. Multi-stage sampling was employed to identify study settings and participants. Structured and pretested questionnaires, 24-hour food recall questionnaires, and meal diversity questionnaires were used for data collection at pre-intervention and post-intervention periods. Data were analyzed using R software v.4.5.2. Differences between intervention and control groups at baseline and endline were assessed using difference-in-differences (DiD) analysis, summarized using adjusted DiD estimates, 95% confidence intervals, and p-values, with p<0.05 considered significant. The PDH intervention significantly improved feeding practices among children aged 6-59 months. Meal frequency increased for 9-23 months (DiD = +1.4; 95% CI: 1.2-1.7; p = 0.034) and 24 months and above (DiD = +1.2; 95% CI: 1.1-1.5; p = 0.017), and dietary diversity rose (DiD = +1.3; 95% CI: 1.1-1.9; p < 0.001). Nutrient-dense food consumption improved, including legumes (DiD = +32.6%; p < 0.001) and animal-source foods (DiD = +35.4%; p < 0.001). Energy and protein intake increased across all age groups (p < 0.05), and micronutrient intakes (iron, vitamin A, and vitamin C) also rose significantly (p < 0.05). The PDH intervention substantially improved caregiver feeding practices, increased dietary diversity, and enhanced macro- and micronutrient intake, demonstrating its effectiveness as a scalable, community-driven strategy for sustainably improving child nutrition in high-burden settings.
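The difference-in-differences estimate behind figures like "DiD = +1.4" is the change in the intervention group minus the change in the control group. A minimal sketch; the numbers below are illustrative placeholders, not the study's data (adjusted estimates would additionally control for covariates):

```python
def did_estimate(pre_int, post_int, pre_ctrl, post_ctrl):
    """(post - pre) in the intervention group minus (post - pre) in control."""
    return (post_int - pre_int) - (post_ctrl - pre_ctrl)

# e.g. mean daily meal frequency (hypothetical group means):
effect = did_estimate(pre_int=2.1, post_int=3.6, pre_ctrl=2.0, post_ctrl=2.1)
# the intervention gained 1.5 meals/day, the control 0.1, so DiD = +1.4
```

Subtracting the control group's change nets out secular trends that would have occurred without the intervention, which is why DiD is preferred over a simple pre-post comparison within the intervention arm.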
Bui, L. V.; Nguyen, D. N.
Background. Vietnam's disease burden has shifted from communicable, maternal, neonatal, and nutritional (CMNN) causes to non-communicable diseases (NCDs), but the tempo, drivers, and regional positioning of this transition have not been jointly quantified. We characterised Vietnam's epidemiological transition over 1990-2023 against ten Southeast-Asian (SEA) peers. Methods. Using Global Burden of Disease 2023 data, we computed average annual percent changes (AAPCs) with 95% CIs from joinpoint regression (BIC-penalised, up to three break-points) for age-standardised DALY rates and cause-composition shares. We applied Das Gupta three-factor decomposition to the 1990-2023 absolute DALY change (population-size, age-structure, and age-specific-rate effects) and benchmarked Vietnam's NCD share against an SDI-conditional peer trajectory via leave-one-out quadratic regression. Premature mortality was quantified as WHO 30q70 under both broad NCD and strict SDG 3.4.1 definitions, using Chiang II life-table adjustment applied identically across all eleven countries. Findings. The CMNN age-standardised DALY rate fell from 13,295.9 to 4,022.1 per 100,000 (AAPC -4.63%/year; 95% CI -4.80 to -4.46); the NCD rate fell only from 21,688.2 to 19,282.8 (AAPC -0.37; -0.45 to -0.30). The NCD share of total DALYs rose from 52.99% to 70.67% (+17.67 pp; AAPC +1.09). Vietnam ranked fourth of eleven SEA countries in 2023 (up from sixth in 1990) and sat 5.3% above the SDI-expected trajectory. Das Gupta decomposition attributed the +10.63 million NCD DALY increase to population growth (+6.26 M) and ageing (+6.08 M); rate change removed only 1.71 M. Premature NCD mortality fell from 25.02% to 21.80% (broad; a 12.9% reduction) and from 22.17% to 19.50% (SDG 3.4.1; 12.0%; Vietnam sixth of eleven), far short of the SDG 3.4 one-third-reduction target. Interpretation. Vietnam has entered a disability- and ageing-dominated NCD phase. Meeting SDG 3.4 by 2030 requires population-scale primary prevention sized to demographic momentum.
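The 30q70 indicator used above is the unconditional probability of dying between exact ages 30 and 70, built by chaining survival through eight 5-year age bands. A minimal sketch, assuming the common uniform-deaths conversion from age-band rates to probabilities; the study itself applies a Chiang II life-table adjustment, which differs in detail:

```python
def prob_30q70(rates_by_band):
    """Unconditional probability of dying between exact ages 30 and 70.

    rates_by_band: age-specific mortality rates (deaths per person-year)
    for the eight 5-year bands 30-34, 35-39, ..., 65-69.
    """
    assert len(rates_by_band) == 8
    survival = 1.0
    for m in rates_by_band:
        q = 5 * m / (1 + 2.5 * m)  # band death probability (illustrative conversion)
        survival *= 1 - q          # survive this band conditional on reaching it
    return 1 - survival

# e.g. a flat mortality rate of 5 per 1,000 person-years in every band
# yields roughly an 18% probability of dying before age 70:
p = prob_30q70([0.005] * 8)
```

Chaining band probabilities multiplicatively, rather than summing rates, is what makes the indicator a true probability and keeps it comparable across countries with different age structures.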
Nguyen, A. T.; Nankabirwa, J. I.; Kakuru, A.; Roh, M. E.; Aguti, M.; Adrama, H.; Kizza, J.; Olwoch, P.; Kamya, M. R.; Dorsey, G.; Jagannathan, P.; Benjamin-Chung, J.
Introduction: Intermittent preventive treatment in pregnancy (IPTp) with sulfadoxine-pyrimethamine (SP) has become less effective at preventing malaria due to rising parasite resistance. IPTp with dihydroartemisinin-piperaquine (DP) alone or in combination with SP (DP+SP) dramatically lowers the risk of malaria in pregnancy compared to SP but is associated with lower birthweight and early life wasting. We estimated the effect of IPTp-DP, DP+SP, and SP on infant growth outcomes and assessed possible treatment mechanisms through a causal mediation analysis. Methods: We used infant follow-up data (N=761) from a trial (NCT04336189) that randomized pregnant women to receive monthly IPTp-DP, SP, or DP+SP. We compared weight-for-length (WLZ) and length-for-age (LAZ) z-scores between treatment arms. We assessed possible mediation through pregnancy, birth, and infancy factors using interventional indirect effect models. Results: Compared to IPTp-SP, IPTp-DP+SP decreased mean WLZ by 0.18 [95% confidence interval (CI) -0.03, 0.39] between 1-3 months and 0.28 (95% CI 0.07, 0.49) between 4-6 months, with the largest differences among primigravidae. Lower risk of active placental malaria in IPTp-DP+SP helped reduce differences in mean WLZ vs IPTp-SP (+0.06, 95% CI 0.02, 0.10). The IPTp-DP+SP arm had up to 0.28 lower mean LAZ between 7-13 months compared to IPTp-DP, particularly among children who were wasted between 0-6 months; low birthweight had a persistent, mediating effect on linear growth. Conclusion: Adverse birth outcomes contributed to early growth faltering among children born to mothers receiving IPTp-DP+SP vs IPTp-SP, but the prevention of placental malaria partially counteracted the negative effects of IPTp-DP+SP on ponderal growth.
Tchum, E. K. Y.; Koto, J. E.; Kanyoke, F.; Opoku, O.; Ussher, F.; Dassah, S. D.; Amoani, B.; Tchum, S. K.; Nyarko, E.
Background: Affecting 40% of infants and young children worldwide, anaemia in sub-Saharan Africa hampers cognitive and physical development, often in ways that cannot be reversed. Iron-based micronutrient powders (MNPs) are recommended to combat anaemia, but concerns remain about their safety and effectiveness in malaria-endemic areas. We evaluated the impact of iron-based MNPs on growth measurements and malaria-related anaemia among preschool children in Ghana. Methods: We conducted a secondary analysis of a cluster-randomized, double-blind, placebo-controlled trial in the Bono Region, Ghana. Children aged 6-35 months (n=1,958) received daily MNP containing 12.5 mg elemental iron or placebo for five months. Anthropometric indices, haemoglobin, and malaria parasitaemia were assessed at baseline and endline. Adjusted analysis of covariance (ANCOVA) models estimated effects on height-for-age (HAZ), weight-for-age (WAZ), and weight-for-height (WHZ) z-scores. Binomial regression with identity link estimated risk differences for malaria-induced anaemia. Cluster-robust standard errors were applied at the compound level, and intracluster correlation coefficients (ICCs) were estimated. Results: 1,815 (92.7%) children completed the endline survey, of whom 1,806 were included in the final analysis. Baseline characteristics were balanced between groups. Iron-containing MNP had no significant effect on endline HAZ (β=0.026, p=0.609), WAZ (β=-0.015, p=0.719), or WHZ (β=-0.035, p=0.463). However, the intervention reduced the risk of malaria-induced anaemia (risk difference 0.050, 95% CI 0.004-0.096; p=0.032). Female sex was associated with higher HAZ (β=0.149, p=0.005). Conclusion: Iron-containing MNP did not improve short-term growth but was associated with a modest reduction in malaria-induced anaemia.
These findings support the safe use of iron fortification in malaria-endemic settings while underscoring the need for integrated strategies to address persistent growth faltering and gender specificity.
Essex, R.; Lim, S.; Jagnoor, J.
Introduction: Drowning risk begins with water exposure, yet population-water relationships have rarely been quantified at scale using environmental measures. This study explored whether satellite-derived exposures were associated with subnational drowning mortality and whether associations differed by country income level. Methods: We linked Global Burden of Disease (GBD 2021) age-standardised drowning mortality rates to satellite-derived exposures for 212 subnational regions across 12 countries (2006-2021; 3,392 region-years). Exposures were extracted via Google Earth Engine and standardised. Gamma-log generalised linear mixed models included region random intercepts and year fixed effects. Income-stratified models were estimated separately. Supplementary models assessed maritime vessel activity. Results: Near-water population percentage was the strongest correlate of drowning (IRR 1.40; 95% CI 1.33-1.47). Permanent water coverage was protective (IRR 0.80; 0.73-0.88), as were nighttime lights (IRR 0.96; 0.95-0.97) and hot days ≥30°C (IRR 0.95; 0.92-0.99). Mean temperature (IRR 1.17; 1.11-1.23) and precipitation (IRR 1.03; 1.01-1.04) were positively associated. Near-water effects were consistent across income strata (LIC 1.25; MIC 1.31; HIC 1.24), while other predictors showed weak or inconsistent within-strata associations. Vessel activity was modestly associated with drowning in Global Fishing Watch models (IRR 1.05; 1.01-1.09) but not in Synthetic Aperture Radar models. Discussion: Satellite-derived indicators can characterise drowning risk at scale, with population proximity to water emerging as a robust cross-context correlate. Protective associations for permanent water suggest landscape configuration may shape risk beyond proximity alone, highlighting geospatial data's value for targeting prevention where surveillance is limited.
Hassell, N.; Marcenac, P.; Bationo, C. S.; Hirve, S.; Tempia, S.; Rolfes, M. A.; Duca, L. M.; Hammond, A.; Wijesinghe, P. R.; Heraud, J.-M.; Pereyaslov, D.; Zhang, W.; Kondor, R. J.; Azziz-Baumgartner, E.
Introduction: Modeling when influenza epidemics typically occur can help countries optimize surveillance, time clinical and public health interventions, and reduce the burden of influenza. Methods: We used influenza virus detections reported during 2011-2024 by 180 countries to the Global Influenza Surveillance and Response System, excluding the COVID-19 pandemic-impacted years (2020-2023). We analyzed data by calendar-year (week 1-52) or shifted-year (week 30-29) time windows, based on when most influenza detections occurred in each country. For countries with sufficient data, we computed generalized additive models (GAMs) of each country's weekly influenza-positive tests to smooth and impute time series distributions. From these GAMs, we calculated each country's normalized weekly influenza burden. Country-specific normalized time series were grouped using hierarchical k-means clustering, minimizing the Euclidean distance between time series within clusters. We calculated cluster-specific GAMs to estimate average seasonal timing. Countries without sufficient data were assigned to a cluster based on population-weighted latitudinal distance to a cluster's mean latitude. Results: We identified five clusters, or epidemic zones, from 111 countries with sufficient data. The influenza burden in epidemic zones A and B was consistent with a northern hemisphere pattern, with most influenza detections occurring during October-April (A) and September-March (B), while epidemic zones D and E were characterized by southern hemisphere-like seasonal timing, with most influenza burden occurring during May-November. Epidemic zone C had most influenza burden occurring during September-March; most countries assigned to this cluster were in the tropics.
Conclusion: Epidemic zones may serve as a useful tool to strengthen and optimize influenza surveillance for global health decision-making (e.g., during vaccine strain composition discussions) and to guide country preparedness efforts for seasonal influenza epidemics, including the timing of enhanced surveillance, as well as the procurement and delivery of vaccines and antivirals.
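The clustering step above groups normalized weekly burden curves by Euclidean distance. A rough sketch with plain Lloyd's k-means on synthetic two-season data; the series, k, seeding, and helper are all illustrative, not the study's hierarchical implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for weekly influenza counts: ten "countries"
# peaking near week 2 and ten peaking near week 28.
weeks = np.arange(52)
north = np.exp(-0.5 * ((weeks - 2) / 6) ** 2)
south = np.exp(-0.5 * ((weeks - 28) / 6) ** 2)
series = np.vstack([north + 0.05 * rng.random(52) for _ in range(10)]
                   + [south + 0.05 * rng.random(52) for _ in range(10)])
series /= series.sum(axis=1, keepdims=True)  # normalized weekly burden

def kmeans(X, init_idx, iters=50):
    """Plain Lloyd's k-means on rows of X, seeded with explicit rows."""
    centers = X[list(init_idx)].copy()
    for _ in range(iters):
        # squared Euclidean distance of every row to every center
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(centers))])
    return labels

labels = kmeans(series, init_idx=(0, 10))  # one seed from each seasonal shape
```

Normalizing each series to sum to one before clustering, as the abstract describes, makes the grouping depend on seasonal timing rather than on each country's absolute case counts.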
Wongnak, P.; Chaisiri, K.; Perrone, C.; Chalvet-Monfray, K.; Areechokchai, D.; Pan-ngum, W.
Background: Scrub typhus is a major yet neglected vector-borne disease in Thailand, where it has been nationally notifiable for over two decades. However, long-term changes in its epidemiology, including reporting rates, transmission intensity, disease severity, and seasonal patterns, have not been comprehensively characterised at the national level. Methodology: We analysed 22 years of national surveillance data for scrub typhus in Thailand (2003-2024) using a latent process model that jointly fits reported cases with published nationwide seroprevalence data and antibody kinetics to estimate reporting rates and underlying transmission dynamics across all 77 provinces of Thailand. Findings: Over the 22-year study period, 143,096 cases and 119 deaths were reported nationally. The estimated reporting proportion broadly mirrored transmission intensity, being higher in high-burden regions and lower elsewhere. A synchronous decline in detection was observed across all regions during the COVID-19 pandemic, followed by a rapid rebound by 2024. After accounting for these reporting dynamics, the force of infection was highest in the northern provinces but also substantial in the northeast and south, with upward trends in some provinces. Susceptibility among older adults aged 65 and above increased progressively over the study period, reversing the pattern observed two decades earlier. Case-fatality in the 25-35-year reference group was low and declined from 0.14% (95% Credible Interval [CrI]: 0.06-0.29%) to 0.06% (95% CrI: 0.02-0.12%), but relative case-fatality remained consistently highest among adults above 65 across all periods. Three geographically distinct seasonal patterns were identified, all stable over time.
Conclusion: Over two decades, scrub typhus transmission in Thailand has been shown to extend well beyond its traditionally recognised northern focus, with substantial burden in previously underappreciated regions, while the demographic profile of those most affected has shifted progressively toward older adults. These findings support the need for regionally tailored surveillance, age-targeted clinical preparedness, and sustained investment in understanding the ecological drivers of transmission. Key messages: Scrub typhus is a common but neglected cause of fever in Thailand, where it has been reported through the national surveillance system for over two decades. However, trends in reported cases can be misleading because they reflect not only true changes in transmission but also variation in diagnosis and reporting over time and across regions. We developed a model that combines surveillance data with seroprevalence surveys and antibody kinetics to separate true changes in transmission from variation in reporting, allowing us to estimate how transmission intensity, disease severity, and seasonal patterns have evolved from 2003 to 2024 across all 77 provinces. We found that substantial transmission occurs not only in the well-studied northern provinces but also in the northeast and south, where the disease has received less attention. Susceptibility has progressively shifted toward older adults, who also face the highest case-fatality, while three distinct seasonal patterns vary by region but have remained stable over time. These findings suggest that scrub typhus control in Thailand requires a shift from a predominantly northern focus toward regionally tailored strategies that account for local transmission timing, an ageing at-risk population, and the ecological drivers that sustain transmission in each setting.
DeCuir, J.; Reeves, E. L.; Weber, Z. A.; Yang, D.-H.; Irving, S. A.; Tartof, S. Y.; Klein, N. P.; Grannis, S. J.; Ong, T. C.; Ball, S. W.; DeSilva, M. B.; Dascomb, K.; Naleway, A. L.; Koppolu, P.; Salas, S. B.; Sy, L. S.; Lewin, B.; Contreras, R.; Zerbo, O.; Hansen, J. R.; Block, L.; Jacobson, K. B.; Dixon, B. E.; Rogerson, C.; Duszynski, T.; Fadel, W. F.; Barron, M. A.; Mayer, D.; Chavez, C.; Yates, A.; Kirshner, L.; McEvoy, C. E.; Akinsete, O. O.; Essien, I. J.; Sheffield, T.; Bride, D.; Arndorfer, J.; Van Otterloo, J.; Natarajan, K.; Ray, C. S.; Payne, A. B.; Adams, K.; Flannery, B.; Garg,
Background: The 2024-25 influenza season was the most severe in the United States (US) since 2017-18, with co-circulation of both influenza A virus subtypes (H1N1 and H3N2). Influenza vaccine effectiveness (VE) has varied by season, setting, and patient characteristics. Methods: Using electronic healthcare encounter data from eight US states, we evaluated VE against influenza-associated hospitalizations and emergency department or urgent care (ED/UC) encounters from October 2024 to April 2025 among children aged 6 months-17 years and adults aged 18 years and older. Using a test-negative, case-control design, we compared the odds of influenza vaccination between acute respiratory illness (ARI) encounters with a positive (cases) versus negative (controls) test for influenza by molecular assay, adjusting for confounders. Results: Analyses included 108,618 encounters (5,764 hospitalizations and 102,854 ED/UC encounters) among children and 309,483 encounters (76,072 hospitalizations and 233,411 ED/UC encounters) among adults. Among children across care settings, 17.0% (6,097/35,765) of cases versus 29.4% (21,449/72,853) of controls were vaccinated. Among adults, 28.2% (21,832/77,477) of cases versus 44.2% (102,560/232,006) of controls were vaccinated. VE was 51% (95% confidence interval [95% CI]: 41-60%) against influenza-associated hospitalizations and 54% (95% CI: 52-55%) against influenza-associated ED/UC encounters among children. VE was 43% (95% CI: 41-46%) against influenza-associated hospitalizations and 49% (95% CI: 47-50%) against influenza-associated ED/UC encounters among adults. Conclusions: Influenza vaccination provided protection against influenza-associated hospitalizations and ED/UC encounters among children and adults in the US during the severe 2024-25 influenza season. These findings support influenza vaccination as an important tool to reduce influenza-associated disease.
Conte Cortez Martins, G.; Lutwama, J. J.; Owor, N.; Namulondo, J.; Ross, J. E.; Lu, X.; Asasira, I.; Kiyingi, T.; Nsereko, C.; Nsubuga, J. B.; Shinyale, J.; Kiwubeyi, M.; Nankwanga, R.; Nie, K.; Reynolds, S. J.; Kayiwa, J.; Kim-Schulze, S.; Bakamutumaho, B.; Cummings, M.
Objective: Studies of nutritional status and host responses during severe and critical illness have focused predominantly on obesity; in contrast, the relationship between undernutrition, host responses, and clinical outcomes in adults hospitalized with severe infection remains poorly defined. We sought to determine whether severe undernutrition is associated with distinct host responses and clinical outcomes in adults hospitalized with severe infection. Design: Prospective cohort study. Setting: Two public referral hospitals in Uganda. Patients: Non-pregnant adults (≥18 yr) hospitalized with severe, undifferentiated infection. Interventions: None. Measurements and Main Results: We analyzed clinical data and serum Olink proteomic data from 432 participants (median age, 45 yr [IQR, 31-57 yr]; 44% male). Overall, 213 participants (49%) met prespecified criteria for undernutrition, including 52 (12%) with severe undernutrition. Clinically, severe undernutrition was associated with HIV coinfection, microbiologically diagnosed tuberculosis, greater physiological instability, and higher mortality. After adjustment for age, sex, illness duration, study site, and HIV, malaria, and tuberculosis coinfection, severe undernutrition was associated with higher expression of proteins involved in pro-inflammatory immune signaling, endothelial and vascular remodeling, hypoxia and oxidative stress responses, and extracellular matrix remodeling, together with lower expression of proteins linked to growth signaling, anticoagulant regulation, and lipid homeostasis. Conclusions: Severe undernutrition is associated with a distinct high-risk clinical phenotype and biologic signature in adults hospitalized with severe infection. These findings suggest that undernutrition may potentiate key domains of sepsis pathobiology, with implications for strengthening nutritional support and informing host-directed treatment strategies in low- and middle-income countries where malnutrition is common.
Key Points
Question: How does undernutrition influence immune, metabolic, and endothelial responses to severe infection in adults?
Findings: In this multicenter cohort study of 432 adults hospitalized with severe infection in Uganda, severe undernutrition was associated with greater physiologic instability, higher mortality, and a distinct proteomic host-response profile. Adults with severe undernutrition exhibited a proteomic signature characterized by pro-inflammatory immune signaling, endothelial and extracellular matrix remodeling, and hypoxia and oxidative stress responses, together with lower expression of proteins involved in growth signaling, anticoagulant regulation, and lipid homeostasis.
Meaning: Severe undernutrition is associated with a distinct high-risk clinical and biologic phenotype during severe infection, with implications for nutritional support, risk stratification, and host-directed therapeutic strategies, particularly in low- and middle-income countries.
Wan, Y. I.; Pearse, R. M.; Prowle, J. R.
Objective: To examine the impact of acute illness on long-term health and describe any differences in these associations between socioeconomic and ethnic groups. Design: Longitudinal population study. Setting: Linked primary and secondary care data recorded in the Clinical Practice Research Datalink (CPRD). Participants: Adults (≥18 years) residing in England, registered with a primary care general practice (GP) between 1st January 2012 and 31st December 2022, who had not opted out of inclusion in CPRD and linked data sources. Socioeconomic deprivation was defined using the Index of Multiple Deprivation (IMD) and ethnicity by UK census 2011 definitions. Main outcome measures: The primary outcome was new long-term disease and multimorbidity (two or more long-term diseases). The exposure was hospitalisation for acute illness. Results: We included 18,329,659 people, of whom 9,339,394 (51.0%) were women, 7,430,555 (40.5%) were from the most deprived deciles (IMD 1-4), and 3,009,717 (16.4%) were from a minority ethnic group. 6,038,272 (32.9%) people experienced hospitalisation for acute illness. Hospitalisation was associated with increased onset of long-term disease in those alive at the end of follow-up (41.1% hospitalised vs 18.7% not hospitalised; adjusted HR 2.48, 2.47 to 2.48). Compared with those not hospitalised, those who had been hospitalised were more likely to change from being disease-free at baseline to having a new long-term disease (12.9% vs. 7.5%), develop multimorbidity (4.7% vs. 1.1%), or transition to multimorbidity if they had pre-existing disease (8.1% vs. 1.8%). Age-standardised hospitalisation rates were highest in the most deprived decile and in people with Black ethnicity. Comparative hospitalisation ratios ranged from 1.78 (2018) to 1.96 (2021) for IMD 1 versus IMD 10, and from 1.03 (2017) to 1.08 (2021) for Black versus White ethnicity.
Conclusions: Acute hospitalisation is a key stage in the development of long-term disease and may be an underutilised opportunity for intervention to alter healthy-life trajectories and reduce health inequality.
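The age-standardised hospitalisation rates reported above are typically computed by direct standardisation: weighting stratum-specific rates by a common reference population before comparing groups. A minimal sketch, using entirely hypothetical stratum counts rather than the CPRD data:

```python
# Direct age standardisation: weight age-stratum-specific rates by a
# shared reference population, then compare groups as a rate ratio.
# All counts below are hypothetical, for illustration only.

def standardised_rate(events, person_years, ref_weights):
    """Directly standardised rate per 1,000 person-years."""
    assert len(events) == len(person_years) == len(ref_weights)
    total_w = sum(ref_weights)
    weighted = sum(w * (e / py)
                   for e, py, w in zip(events, person_years, ref_weights))
    return 1000 * weighted / total_w

# Two deprivation groups, three age strata (hypothetical numbers).
ref = [50_000, 30_000, 20_000]  # reference population by age band
imd1 = standardised_rate([120, 300, 500], [10_000, 8_000, 5_000], ref)
imd10 = standardised_rate([60, 180, 350], [10_000, 8_000, 5_000], ref)
ratio = imd1 / imd10  # comparative hospitalisation ratio
```

Because both groups are weighted to the same reference age structure, the ratio is not confounded by differences in age distribution between deprivation deciles.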
Nilsson, A.; da Silva, M.; Le, H. T.; Haggstrom, C.; Wahlstrom, J.; Michaelsson, K.; Trolle Lagerros, Y.; Sandin, S.; Magnusson, P. K.; Fritz, J.; Stocks, T.
Excess body weight has been associated with increased cancer risk, but the role of weight change across adulthood remains unclear. We examined body weight trajectories from ages 17 to 60 and their associations with site-specific cancer incidence. Data were based on the ODDS study, a pooled, nationwide cohort study in Sweden, with data on weight spanning 1911 to 2020, and cancer follow-up through 2023. Weight trajectories were estimated with linear mixed effects models in individuals with at least three weight measurements. Cox regressions estimated hazard ratios for associations between weight trajectories and established and potentially obesity-related cancers. Fifth versus first quintile of weight change was associated with many cancers, most strongly with esophageal adenocarcinoma in men (HR 2.25; 95% CI 1.66-3.04), liver cancer in men (HR 2.67; 95% CI 2.15-3.33), endometrial cancer in women (HR 3.78; 95% CI 3.09-4.61), and pituitary tumors in both sexes (men: HR 3.13 [95% CI 2.13-4.61]; women: HR 2.13 [95% CI 1.41-3.22]). Associations varied by sex and age. Heavier weight at age 17 years and earlier obesity onset were also associated with higher cancer incidence. These findings highlight the importance of a life-course approach to weight management and support sex- and age-targeted cancer prevention strategies.
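The fifth-versus-first quintile contrast above requires first ranking individuals by their estimated weight change (e.g., per-person slopes from the linear mixed models). A minimal sketch of quintile assignment, using hypothetical simulated slopes rather than the ODDS data:

```python
# Assign individuals to quintiles of estimated weight change, as in a
# fifth-vs-first quintile contrast. Slopes are hypothetical (kg/year).
import random

random.seed(1)
slopes = [random.gauss(0.4, 0.3) for _ in range(1000)]

def quintile(value, sorted_values):
    """Return quintile membership (1-5) of value within sorted_values."""
    n = len(sorted_values)
    rank = sum(v <= value for v in sorted_values)  # 1-based rank
    return min(5, 1 + (rank - 1) * 5 // n)

s = sorted(slopes)
groups = [quintile(x, s) for x in slopes]
```

The resulting group labels would then enter a Cox model as a categorical exposure, with quintile 1 as the reference to yield hazard ratios such as those reported for esophageal adenocarcinoma and endometrial cancer.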
Decker, J. E.; Morales, K. H.; Chen, P.-W.; Master, L.; Kwon, M.; Jansen, E. C.; Zemel, B. S.; Mitchell, J. A.
Background: The timing of energy intake could be important in the development of obesity. However, most observational evidence stems from adults, anthropometrically defined obesity outcomes, single meal timing phenotyping, and traditional regression modeling. Objective: We aimed to describe meal timing patterns in adolescents and determine if they associated with fat mass by modeling the median and all other percentiles of the frequency distribution. Methods: We analyzed data from the Sleep and Growth Study 2 (S-Grow2, N=286, 12-13y). Participants completed 3-day 24-hour dietary recalls, and time-stamped eating occasions were used to define 8 meal timing traits, with aid from self-reported wake and bed timing. Principal component analysis (PCA) identified multi-dimensional meal timing patterns. Fat mass index (FMI) was estimated using dual-energy X-ray absorptiometry. Quantile regression assessed whether there were associations between meal timing traits and FMI across the entire FMI frequency distribution. Results: The typical first and last eating occasions were 8:00am (40 minutes after waking) and 8:00pm (2.7 hours before sleep), respectively, thus the eating period typically lasted 11.5 hours per day. The typical eating period midpoint was 2:15pm, and the timing when 50% of energy intake was consumed typically occurred at 3:15pm. PCA revealed three meal timing patterns: 1) Delayed Start, Condensed Eating Period (43% of variance; shorter eating period and delayed timing of first eating); 2) Late, Sleep Proximal Eating (30% of variance; later timing of last eating and extended eating period), and 3) Later Energy Intake (10% of variance; delayed energy intake midpoint). Higher scores for the Delayed Start, Condensed Eating Period pattern associated with higher body mass index and FMI at the upper tails of their distributions.
Conclusions: Distinct multidimensional meal timing patterns emerged in early adolescence, with the delayed start, condensed eating period pattern potentially associated with higher adiposity.
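Several of the meal-timing traits described above (eating-period length, clock midpoint, and the time by which half of daily energy is consumed) can be derived directly from time-stamped eating occasions. A minimal sketch with hypothetical occasions, not the S-Grow2 data:

```python
# Derive meal-timing traits from time-stamped eating occasions.
# Times are hours from midnight; (hour, kcal) pairs are hypothetical.
occasions = [(8.0, 400), (12.5, 600), (16.0, 200), (20.0, 800)]

first = min(t for t, _ in occasions)
last = max(t for t, _ in occasions)
eating_period = last - first       # length of the daily eating window
midpoint = (first + last) / 2      # clock midpoint of the window

# Time by which 50% of daily energy intake has been consumed.
total = sum(k for _, k in occasions)
cum = 0
for t, kcal in sorted(occasions):
    cum += kcal
    if cum >= total / 2:
        energy_midpoint = t
        break
```

Traits like these, together with wake/bed timing, would feed the PCA that produced the three patterns above.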
Yoshimoto, H.; Hadano, T.; Shimada, K.; Gosho, M.; Fukuda, T.; Komano, Y.; Umeda, K.; Iwase, M.; Kusano, Y.; Kawabata, T.
Background: Practical alcohol risk-reduction strategies are widely recommended in public-facing alcohol guidance, but randomized evidence from socially interactive drinking episodes remains limited. We conducted a pilot cluster randomized trial to evaluate the feasibility and preliminary effects of a package intervention comprising practical drinking-strategy information, participant self-selection of same-day strategies, and a brief commitment declaration in a social drinking laboratory. Methods: This single-center, parallel-group pilot trial was conducted in Japan. Pre-existing social groups participated. One or two groups scheduled in the same session slot were combined into a time-slot allocation unit, which was randomized 1:1 either to the package intervention or to alcohol-related knowledge only. The primary outcome was total pure alcohol intake during the first 120 min. Session satisfaction on a Visual Analog Scale (VAS) was a prespecified secondary participant-experience outcome. Results: Of 83 interested individuals, 63 were randomized, and 59 participants in 17 social groups and 12 allocation units were included in the modified intention-to-treat analysis. The mean paired intervention-control difference for 120-min alcohol intake was -8.84 g (95% confidence interval [CI] -27.92 to 10.23; exact sign-flip p = 0.281). The corresponding exploratory 0-30 min difference was -4.90 g (95% CI -10.48 to 0.68; exact sign-flip p = 0.094). In a genotype-adjusted participant-level sensitivity analysis, the intervention coefficient for 120-min intake was -16.0 g (95% CI -30.9 to -1.1; p = 0.036). Session satisfaction was high in both arms with no clear between-arm difference. Next-day follow-up was 100%, and no adverse-event-related discontinuations occurred. Conclusions: The intervention was feasible to deliver in a socially interactive drinking setting, and session satisfaction was high in both arms. Primary allocation-unit estimates favored lower alcohol intake but were imprecise.
Larger trials are needed to estimate effects more precisely, while considering the potential influence of genotype imbalance on effect estimation in East Asian samples. Trial registration: University Hospital Medical Information Network Clinical Trials Registry (UMIN-CTR) UMIN000060685. Registered 17 February 2026.
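The "exact sign-flip p" reported for the paired allocation-unit differences comes from enumerating all 2^n sign assignments of the differences under the null of no effect. A minimal one-sided sketch, with hypothetical differences rather than the trial's data:

```python
# Exact sign-flip (permutation) test for paired differences: under the
# null, each difference is equally likely to have either sign, so the
# p-value is the fraction of all 2^n sign assignments whose mean is at
# least as extreme (here: as low) as the observed mean.
from itertools import product

def sign_flip_p(diffs):
    """One-sided exact sign-flip p-value (alternative: mean < 0)."""
    n = len(diffs)
    observed = sum(diffs) / n
    count = 0
    for signs in product((1, -1), repeat=n):
        mean = sum(s * d for s, d in zip(signs, diffs)) / n
        if mean <= observed:
            count += 1
    return count / 2 ** n

# Hypothetical intervention-minus-control differences (g alcohol) per unit.
diffs = [-12.0, -5.5, 3.0, -20.0, -8.0, 1.5, -15.0, -2.0]
p = sign_flip_p(diffs)
```

With only 12 allocation units this exact enumeration is cheap (2^12 = 4,096 assignments), which is why such tests suit small cluster-randomized pilots.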
Shah, A.; Chandramouli, A.; Abhayakumar, A.; Rajmani, R. S.; Kamat, S. S.; Balaji, K. N.
Mycobacterium tuberculosis (Mtb) subverts host immune responses via modulation of the host epigenome and metabolism. In this study, we underscore a role for the epigenetic modifier Lysine Specific Demethylase 1 (LSD1) in regulating macrophage metabolism to support mycobacterial pathogenesis. In ex vivo and in vivo infection models, LSD1 inhibition reduced mycobacterial CFU and alleviated lung pathology. Metabolomic analysis of Mtb-infected, LSD1-deficient macrophages revealed increased levels of alpha-ketoglutarate (AKG), a crucial TCA cycle metabolite, through regulation of genes implicated in glutamine breakdown. Moreover, exogenous addition of AKG resulted in reduced oxidative stress and attenuated lipid peroxidation (LPO) with a consequent decrease in Mtb survival. Blocking glutamine breakdown in LSD1-deficient macrophages failed to reduce LPO and promoted Mtb intracellular survival, highlighting the role of the LSD1-AKG axis in this immunomodulation. Dietary supplementation of AKG to Mtb-infected mice improved lung pathology, limited Mtb dissemination and reduced the levels of oxidative malondialdehyde adducts. Therefore, we highlight a host-protective role of AKG during Mtb pathogenesis through suppression of lipid peroxidation and uncover an epigenetic-metabolic axis exploited by Mtb, thereby positing dietary supplementation of AKG as a potential therapeutic strategy against tuberculosis.
Goodman, M. L.; Maknojia, S.; Sciba, A.; Robertson, D.; Keiser, P.
Background: Opioid-related mortality in Texas has escalated dramatically, increasingly driven by illicitly manufactured fentanyl. To address local surges in mortality, the Galveston County Health District deployed the Galveston County Opioid Defense Effort (GCODE) in July 2023, leveraging digitally integrated surveillance data from emergency medical services (EMS) and the Medical Examiner to provide targeted naloxone distribution in identified overdose hot spots. Methods: Using a segmented interrupted time series (ITS) design and Poisson regression with robust standard errors, we evaluated the population-level impact of GCODE on opioid-involved mortality through the end of 2025. Data were sourced from the Galveston Area Ambulance Authority (GAAA) and vital statistics (ICD-10 codes). We assessed mortality trajectory changes, the observed fatality ratio among EMS-detected opioid events (the Survival Gap), and demographic and geographic covariates. Results: The Poisson ITS model included 519 weekly observations (N = 14,827 tract-weeks across 101 census tracts). Pre-intervention, opioid mortality increased by 0.16% weekly (IRR = 1.0016; 95% CI: 1.000-1.003; p = 0.011). Following GCODE deployment, the mortality trajectory reversed to a sustained 0.55% weekly decrease (IRR = 0.9945; 95% CI: 0.990-0.999; p = 0.021). The observed fatality ratio among EMS-detected events declined from 7.59% (pre-intervention mean; SD = 0.111) to 1.71% (post-intervention; SD = 0.042; Chi^2 = 19.824; p = 0.0001). Opioid decedents were significantly younger than the general mortality population (OR = 0.945 per year of age; p < 0.001), and were descriptively more likely to lack documented race/ethnicity data (41.23% vs. 8.27% Unknown; p < 0.001), limiting equity analysis. Conclusions: The findings are consistent with GCODE having meaningfully reduced opioid mortality by substantially lowering event-level lethality.
These results suggest that targeted, digitally coordinated harm reduction can decouple overdose incidence from fatal outcomes, with implications for harm reduction program design in structurally constrained environments.
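A segmented ITS of the kind described above is fit on a design matrix with a secular time trend, a post-intervention level-change indicator, and a time-since-intervention slope-change term; exponentiated Poisson coefficients on the time terms give the weekly IRRs. A minimal sketch of that design matrix (the 519-week length matches the abstract, but the intervention week and coefficients below are hypothetical):

```python
# Segmented interrupted-time-series design matrix: intercept, secular
# weekly trend, post-intervention level change, and post-intervention
# slope change. The intervention week is hypothetical.
import math

def its_row(week, intervention_week):
    post = 1 if week >= intervention_week else 0
    time_since = max(0, week - intervention_week)
    return [1, week, post, time_since]

X = [its_row(w, intervention_week=300) for w in range(519)]

# In a Poisson fit on X, exp(coefficient) is a weekly IRR. Combining a
# pre-trend IRR of 1.0016 with a hypothetical slope-change coefficient
# recovers the post-intervention weekly IRR:
beta_change = math.log(0.9945) - math.log(1.0016)  # hypothetical
post_irr = math.exp(math.log(1.0016) + beta_change)
```

The slope-change coefficient is the quantity of interest here: its sign and significance indicate whether the mortality trajectory bent downward after deployment.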
Harguindeguy, I.; Assandri, M.; Daza Millone, A.; Cavalitto, S.; Serradell, M.; Ortiz, G.
Immunocastration, a non-surgical strategy based on active immunization against gonadotropin-releasing hormone (GnRH), effectively suppresses steroidogenesis and spermatogenesis. However, peptide vaccines targeting poorly immunogenic antigens such as GnRH often fail to elicit robust adaptive immune responses, requiring adjuvants or carrier proteins. Previously, we introduced Coated Bacterial Vaccines (CBVs), a platform that uses chemically inactivated Gram-positive bacteria to display recombinant antigens fused to the SlpA carboxy-terminal domain (dSLPA) on their surface. This system leverages natural pathogen-associated molecular patterns (PAMPs) to enhance immunogenicity without additional adjuvants. In this work, we extended the application of the CBV platform to enhance the immune response against a poorly immunogenic GnRH-based peptide vaccine. GnRH-CBVs were formulated using inactivated Bacillus subtilis var. natto coated with a recombinant GnRH tandem-repeat dSLPA fusion protein and administered to male BALB/c mice. A chitosan-adjuvanted GnRH-dSLPA formulation served as a positive control. GnRH-CBVs induced a strong Th2-biased humoral response, characterized by predominant IgG1 levels comparable to those achieved with chitosan. The resulting antibodies effectively neutralized endogenous GnRH, reducing steroidogenesis and spermatogenesis and inducing marked testicular histological alterations. These findings support CBVs as a promising strategy to enhance peptide vaccine immunogenicity for veterinary immunocastration.